You queried:

markov chain

Results retrieved for:


      [Noun]  | "Markov chain" 


      1: a usually discrete stochastic process (such as a random walk) in which the probabilities of occurrence of various future states depend only on the present state of the system or on the immediately preceding state and not on the path by which the present state was achieved —called also Markoff chain


      Origin: 1938;
      named after A. A. Markov (†1922), Russian mathematician.
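The defining property above — that the probabilities of future states depend only on the present state, not on the path taken to reach it — can be sketched in a few lines of Python. This is a minimal illustration, not tied to any particular library; the two weather states and their transition probabilities are invented for the example.

```python
import random

# Illustrative transition probabilities: from each state, the next state
# is drawn using ONLY the current state (the "memoryless" property in
# the definition above). States and weights are made up for this sketch.
TRANSITIONS = {
    "sunny": (("sunny", "rainy"), (0.8, 0.2)),
    "rainy": (("sunny", "rainy"), (0.4, 0.6)),
}

def step(state, rng):
    """Draw the next state from the distribution for the current state."""
    states, weights = TRANSITIONS[state]
    return rng.choices(states, weights=weights, k=1)[0]

def walk(start, n, seed=0):
    """Generate a path of n steps; note the loop never looks further
    back than path[-1] — that is the Markov property."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(n):
        path.append(step(path[-1], rng))
    return path
```

A random walk, mentioned parenthetically in the definition, is the same idea with states arranged on a line or grid and transitions limited to neighbors.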


